Senior Data Operations Engineer
Kforce Inc
Job Summary
Kforce is seeking a Senior Data Operations Engineer to support a modern data stack and enable data-driven decision-making. This role involves designing, managing, and enhancing data workflows, developing new features, assisting with architectural planning, and conducting code reviews. The engineer will optimize cloud environments and data processes while acting as a technical coach for various stakeholders. Strong proficiency in SQL and Python, along with experience in data warehouses and analytics tools, is essential.
Perks & Benefits
- Medical/dental/vision insurance
- HSA
- FSA
- 401(k)
- Life, disability & AD&D insurance
- Paid time off (for salaried personnel)
- Paid sick leave (for hourly employees on a Service Contract Act project)
Job Description
Kforce has a client that is seeking a Senior Data Operations Engineer to join their dynamic team.
Summary: This role is pivotal in supporting the modern data stack and enabling data-driven decision-making across the organization. You will collaborate with engineers, analysts, business users, and data scientists, providing technical expertise and guidance while ensuring efficient and optimized data workflows.
Key Responsibilities:
- Design, manage, and enhance data workflows within a modern data stack
- Develop new features and assist with architectural planning alongside principal engineers
- Conduct code reviews to maintain quality and performance standards
- Optimize cloud environments and improve data processes for scalability and efficiency
- Act as a coach and resource for technical and non-technical stakeholders
Requirements
- Bachelor's degree in Computer Science or Engineering, or a High School diploma/GED with 4+ years of relevant experience
- 5+ years in data roles with experience in data warehouses and analytics tools
- Strong proficiency in SQL and Python
- Hands-on experience with Snowflake, Databricks, Azure Cloud, and dbt
- Familiarity with ELT orchestration tools (e.g., Azure Data Factory, Airflow)
- Knowledge of Git and Git-based workflows
- Understanding of data modeling, warehousing, and architecture principles
- Excellent problem-solving and communication skills, including the ability to explain complex concepts to non-technical audiences
- Experience optimizing and enhancing cloud environments
- Familiarity with AWS or Google Cloud